Supplementary Materials for: Max-Sliced Mutual Information

A Proofs

Neural Information Processing Systems

A.1 Proof of Proposition 1

Part 1 is restated from [25, Appendix A.1], where it was proved. Proof of Part 2: non-negativity follows directly from the non-negativity of mutual information. Proof of Part 5: the proof relies on the fact that functions of independent random variables are themselves independent. This concludes the proof.

A.2 Proof of Proposition 2

By the translation invariance of mutual information, we may assume w.l.o.g. that the means are zero. Next, we show that we may equivalently optimize with the added unit variance constraint. By [Example 3.4], we have
$$I(A;B) = \frac{1}{2}\log\frac{\det\Sigma_A\,\det\Sigma_B}{\det\Sigma_{(A,B)}} = -\frac{1}{2}\log\left(1-\rho_{A,B}^2\right),$$
where the last equality uses the unit variance property and Schur's determinant formula. Armed with Lemma 1, we are in a position to prove Proposition 2. Since the CCA solutions [...] [Theorem 2.2], which is restated next for completeness.
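The relationship invoked above can be checked numerically. The sketch below (an illustration, not code from the paper) builds a random joint Gaussian covariance, computes the canonical correlations as the singular values of the whitened cross-covariance, and verifies that the determinant-ratio formula for Gaussian mutual information (equivalently, Schur's determinant formula) matches the sum over canonical correlations, with the top canonical correlation giving the single-slice quantity as in Proposition 2. All variable names here are illustrative.

```python
import numpy as np

def inv_sqrt(M):
    """Inverse matrix square root of a symmetric positive-definite matrix."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(w ** -0.5) @ V.T

rng = np.random.default_rng(0)
dx, dy = 3, 2
d = dx + dy

# Random valid joint covariance: G G^T + I is symmetric positive definite.
G = rng.standard_normal((d, d))
Sigma = G @ G.T + np.eye(d)
Sx, Sy = Sigma[:dx, :dx], Sigma[dx:, dx:]
Sxy = Sigma[:dx, dx:]

# Canonical correlations = singular values of the whitened cross-covariance.
rho = np.linalg.svd(inv_sqrt(Sx) @ Sxy @ inv_sqrt(Sy), compute_uv=False)

# Gaussian MI via the determinant ratio (Schur's determinant formula) ...
mi_det = 0.5 * np.log(np.linalg.det(Sx) * np.linalg.det(Sy) / np.linalg.det(Sigma))
# ... equals the sum over canonical correlations.
mi_cca = -0.5 * np.sum(np.log(1 - rho ** 2))

# The top canonical correlation governs the best single slice (cf. Proposition 2).
msmi = -0.5 * np.log(1 - rho[0] ** 2)

assert np.isclose(mi_det, mi_cca)
assert msmi <= mi_cca + 1e-12
```

Because the singular values of the whitened cross-covariance of a positive-definite joint covariance are strictly below 1, all the logarithms above are finite.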







DisDiff: Unsupervised Disentanglement of Diffusion Probabilistic Models

Tao Yang

Neural Information Processing Systems

DPMs, the inherent factors can be automatically discovered, explicitly represented, and cleanly injected into the diffusion process via sub-gradient fields. To tackle this task, we devise an unsupervised approach named DisDiff, achieving disentangled representation learning within the framework of DPMs.
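The phrase "injected into the diffusion process via sub-gradient fields" can be read as a guidance-style decomposition: the sampling drift is the unconditional score plus a sum of per-factor fields, each conditioned on that factor's representation. The sketch below is only an illustration of that decomposition under assumed shapes; `score_uncond`, `sub_gradient_field`, and the random linear maps are hypothetical stand-ins, not the DisDiff architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_factors = 8, 3

# Hypothetical stand-ins for learned networks: random linear maps, shape only.
W_uncond = rng.standard_normal((d, d)) * 0.1
W_factor = rng.standard_normal((n_factors, d, d)) * 0.1

def score_uncond(x_t):
    """Stand-in for the unconditional score estimate at noise level t."""
    return W_uncond @ x_t

def sub_gradient_field(k, x_t, z_k):
    """Stand-in for factor k's sub-gradient field, conditioned on its
    representation z_k (here a scalar scaling, for illustration only)."""
    return (W_factor[k] @ x_t) * z_k

def guided_step(x_t, z, step=0.01):
    """One denoising update whose drift sums the unconditional score and
    all factor-conditioned sub-gradient fields."""
    drift = score_uncond(x_t) + sum(
        sub_gradient_field(k, x_t, z[k]) for k in range(n_factors)
    )
    return x_t + step * drift

x = rng.standard_normal(d)
z = rng.standard_normal(n_factors)   # one representation per discovered factor
x_next = guided_step(x, z)
assert x_next.shape == (d,)
```

Setting a factor's representation `z[k]` to zero removes that factor's field from the drift, which is the sense in which each factor is "explicitly represented" and separately controllable in such a decomposition.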